An algorithm for quadratic ℓ1-regularized optimization with a flexible active-set strategy
Authors
Abstract
We present an active-set method for minimizing an objective that is the sum of a convex quadratic and an ℓ1-regularization term. Unlike two-phase methods that combine a first-order active-set identification step with a subspace phase consisting of a cycle of conjugate gradient iterations, the method presented here has the flexibility of computing one of three possible steps at each iteration: a relaxation step (which releases variables from the active set), a subspace minimization step based on the conjugate gradient iteration, and an active-set refinement step. The choice of step depends on the relative magnitudes of the components of the minimum-norm subgradient. The paper establishes global rates of convergence as well as work complexity estimates. Numerical results illustrating the behavior of the method on four test sets are presented.
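As a concrete illustration of the quantity driving the step selection, the minimal Python/NumPy sketch below computes the minimum-norm subgradient of φ(x) = ½xᵀAx + bᵀx + λ‖x‖₁. This is not the authors' implementation; the matrix A, vector b, weight λ, and sample point x are illustrative assumptions.

```python
import numpy as np

def min_norm_subgradient(A, b, x, lam):
    """Minimum-norm subgradient of phi(x) = 0.5*x'Ax + b'x + lam*||x||_1.

    Illustrative sketch only; A (symmetric positive semidefinite), b, lam
    and the evaluation point x are placeholders, not data from the paper.
    """
    grad = A @ x + b                                   # gradient of the quadratic part
    g = np.empty_like(x)
    nz = x != 0
    g[nz] = grad[nz] + lam * np.sign(x[nz])            # unique subgradient at nonzeros
    z = ~nz
    # At zero components the subdifferential is [grad_i - lam, grad_i + lam];
    # its minimum-norm element is the soft-thresholded gradient.
    g[z] = np.sign(grad[z]) * np.maximum(np.abs(grad[z]) - lam, 0.0)
    return g

# Tiny usage example on a random convex quadratic (illustrative values only).
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + np.eye(5)                                # symmetric positive definite
b = rng.standard_normal(5)
x = np.array([0.0, 1.5, 0.0, -2.0, 0.0])
print(min_norm_subgradient(A, b, x, lam=0.1))
```

In this notation, the components of g over the zero (active) and nonzero (free) variables are the quantities whose relative magnitudes, according to the abstract, determine whether the method relaxes the active set, continues subspace minimization with conjugate gradients, or refines the active set.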
Related resources
Solving A Fractional Program with Second Order Cone Constraint
We consider a fractional program, with both linear and quadratic functions in the numerator and denominator, subject to second-order cone (SOC) constraints. With a suitable change of variables, we transform the problem into a second-order cone programming (SOCP) problem. For the quadratic fractional case, using a relaxation, the problem is reduced to a semi-definite optimization (SDO) program. The p...
Active-set Methods for Submodular Minimization Problems
We consider the submodular function minimization (SFM) and the quadratic minimization problems regularized by the Lovász extension of the submodular function. These optimization problems are intimately related; for example, min-cut problems and total variation denoising problems, where the cut function is submodular and its Lovász extension is given by the associated total variation. When a qua...
A Fast Active Set Block Coordinate Descent Algorithm for ℓ1-regularized least squares
The problem of finding sparse solutions to underdetermined systems of linear equations arises in several real-world problems (e.g. signal and image processing, compressive sensing, statistical inference). A standard tool for dealing with sparse recovery is the ℓ1-regularized least-squares approach that has recently been attracting the attention of many researchers. In this paper, we describe an...
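For orientation only, the sketch below is a generic cyclic coordinate-descent method for the same problem class, 0.5‖Ax − y‖² + λ‖x‖₁, using the closed-form soft-thresholding update for each coordinate. It is not the fast active-set block method of that paper; the data A, y, λ, and iteration count are placeholders.

```python
import numpy as np

def soft_threshold(v, t):
    """Componentwise soft-thresholding (the prox of t*||.||_1)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def cd_lasso(A, y, lam, n_iters=100):
    """Plain cyclic coordinate descent for 0.5*||Ax - y||^2 + lam*||x||_1.

    A generic sketch, not the active-set block method of the cited paper.
    """
    m, n = A.shape
    x = np.zeros(n)
    col_sq = (A ** 2).sum(axis=0)                  # squared column norms
    resid = y - A @ x
    for _ in range(n_iters):
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            rho = A[:, j] @ resid + col_sq[j] * x[j]   # partial correlation for coordinate j
            x_new = soft_threshold(rho, lam) / col_sq[j]
            resid += A[:, j] * (x[j] - x_new)          # keep the residual up to date
            x[j] = x_new
    return x
```

Restricting such updates to a working set of coordinates is the usual motivation for active-set and block variants of this scheme.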
A Globally Convergent Stabilized SQP Method: Superlinear Convergence
Regularized and stabilized sequential quadratic programming (SQP) methods are two classes of methods designed to resolve the numerical and theoretical difficulties associated with ill-posed or degenerate nonlinear optimization problems. Recently, a regularized SQP method has been proposed that allows convergence to points satisfying certain second-order KKT conditions (SIAM J. Optim., 23(4):198...
On the convergence of an active-set method for ℓ1 minimization
We analyze an abridged version of the active-set algorithm FPC_AS proposed in [18] for solving the ℓ1-regularized problem, i.e., a weighted sum of the ℓ1-norm ‖x‖1 and a smooth function f(x). The active-set algorithm alternates between two stages. In the first “nonmonotone line search (NMLS)” stage, an iterative first-order method based on “shrinkage” is used to estimate t...
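Below is a minimal sketch of an ISTA-style "shrinkage" stage of the kind referred to above: forward gradient steps on the smooth part f followed by soft-thresholding, with the support of the final iterate taken as an active-set estimate. This is not the FPC_AS code; the step size, iteration count, and sample data are assumptions.

```python
import numpy as np

def shrinkage_stage(grad_f, x0, lam, step=0.5, n_iters=50):
    """ISTA-style shrinkage iterations for min f(x) + lam*||x||_1.

    Hedged sketch of a first-order shrinkage stage: gradient steps on f
    followed by soft-thresholding. The final support of x provides an
    estimate of the active (zero) set.
    """
    x = x0.copy()
    for _ in range(n_iters):
        v = x - step * grad_f(x)                                  # forward gradient step on f
        x = np.sign(v) * np.maximum(np.abs(v) - step * lam, 0.0)  # shrinkage (soft-thresholding)
    active = np.flatnonzero(x == 0)                               # estimated active set
    return x, active

# Example with a least-squares smooth part f(x) = 0.5*||Ax - y||^2 (illustrative data).
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 8))
y = rng.standard_normal(20)
grad_f = lambda x: A.T @ (A @ x - y)
step = 1.0 / np.linalg.norm(A, 2) ** 2                            # 1/L for this choice of f
x_est, active = shrinkage_stage(grad_f, np.zeros(8), lam=0.5, step=step)
```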
Journal: Optimization Methods and Software
Volume: 30
Issue: -
Pages: -
Year of publication: 2015